Results 1 - 20 of 9,707
1.
Curr Protoc ; 4(5): e1046, 2024 May.
Article in English | MEDLINE | ID: mdl-38717471

ABSTRACT

Whole-genome sequencing is widely used to investigate population genomic variation in organisms of interest. Assorted tools have been independently developed to call variants from short-read sequencing data aligned to a reference genome, including single nucleotide polymorphisms (SNPs) and structural variations (SVs). We developed SNP-SVant, an integrated, flexible, and computationally efficient bioinformatic workflow that predicts high-confidence SNPs and SVs in organisms without benchmarked variants, which are traditionally used for distinguishing sequencing errors from real variants. In the absence of these benchmarked datasets, we leverage multiple rounds of statistical recalibration to increase the precision of variant prediction. The SNP-SVant workflow is flexible, with user options to trade off accuracy against sensitivity. The workflow predicts SNPs and small insertions and deletions using the Genome Analysis ToolKit (GATK) and predicts SVs using the Genome Rearrangement IDentification Software Suite (GRIDSS), and it culminates in variant annotation using custom scripts. A key utility of SNP-SVant is its scalability. Variant calling is a computationally expensive procedure, and thus, SNP-SVant uses a workflow management system with intermediary checkpoint steps to ensure efficient use of resources by minimizing redundant computations and omitting steps where dependent files are available. SNP-SVant also provides metrics to assess the quality of called variants and converts between VCF and aligned FASTA format outputs to ensure compatibility with downstream tools that calculate selection statistics, which are commonplace in population genomics studies. By accounting for both small variants and large structural variants, users of this workflow can obtain a wide-ranging view of genomic alterations in an organism of interest. Overall, this workflow advances our capabilities in assessing the functional consequences of different types of genomic alterations, ultimately improving our ability to associate genotypes with phenotypes. © 2024 The Authors. Current Protocols published by Wiley Periodicals LLC. Basic Protocol: Predicting single nucleotide polymorphisms and structural variations. Support Protocol 1: Downloading publicly available sequencing data. Support Protocol 2: Visualizing variant loci using Integrated Genome Viewer. Support Protocol 3: Converting between VCF and aligned FASTA formats.
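For orientation, the VCF-to-FASTA step named in Support Protocol 3 can be pictured with a minimal Python sketch like the one below. This is an illustration only, not the SNP-SVant custom scripts: the file names are hypothetical, and only simple biallelic SNPs on a single-contig reference are handled (indels and multi-allelic sites are skipped).

```python
# Minimal sketch of a VCF -> consensus FASTA conversion for biallelic SNPs.
# Assumptions: a single-record reference FASTA and an uncompressed VCF whose
# SNP records refer to that contig; indels and multi-allelic sites are ignored.

def read_single_fasta(path):
    """Return (header, sequence) for a one-record FASTA file."""
    header, seq_parts = None, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                header = line[1:]
            elif line:
                seq_parts.append(line)
    return header, "".join(seq_parts)

def apply_snps(reference, vcf_path):
    """Substitute ALT alleles of simple SNPs into a mutable copy of the reference."""
    seq = list(reference)
    with open(vcf_path) as vcf:
        for line in vcf:
            if line.startswith("#"):
                continue
            _chrom, pos, _id, ref, alt = line.rstrip("\n").split("\t")[:5]
            if len(ref) == 1 and len(alt) == 1 and alt in "ACGT":
                idx = int(pos) - 1  # VCF positions are 1-based
                if seq[idx].upper() == ref.upper():
                    seq[idx] = alt
    return "".join(seq)

if __name__ == "__main__":
    header, ref_seq = read_single_fasta("reference.fa")   # hypothetical input paths
    consensus = apply_snps(ref_seq, "filtered_snps.vcf")
    with open("consensus.fa", "w") as out:
        out.write(f">{header}_consensus\n{consensus}\n")
```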


Subject(s)
Polymorphism, Single Nucleotide , Software , Workflow , Polymorphism, Single Nucleotide/genetics , Computational Biology/methods , Genomics/methods , Molecular Sequence Annotation/methods , Whole Genome Sequencing/methods
2.
Nat Commun ; 15(1): 3922, 2024 May 09.
Article in English | MEDLINE | ID: mdl-38724498

ABSTRACT

Identification of differentially expressed proteins in a proteomics workflow typically encompasses five key steps: raw data quantification, expression matrix construction, matrix normalization, missing value imputation (MVI), and differential expression analysis. The plethora of options in each step makes it challenging to identify optimal workflows that maximize the identification of differentially expressed proteins. To identify optimal workflows and their common properties, we conduct an extensive study involving 34,576 combinatoric experiments on 24 gold standard spike-in datasets. Applying frequent pattern mining techniques to top-ranked workflows, we uncover high-performing rules that demonstrate that optimality has conserved properties. Via machine learning, we confirm that optimal workflows are indeed predictable, with average cross-validation F1 scores and Matthews correlation coefficients surpassing 0.84. We introduce an ensemble inference approach that integrates results from individual top-performing workflows, expanding differential proteome coverage and resolving inconsistencies. Ensemble inference provides gains in pAUC (up to 4.61%) and G-mean (up to 11.14%) and facilitates effective aggregation of information across varied quantification approaches such as topN, directLFQ, MaxLFQ intensities, and spectral counts. However, further development and evaluation are needed to establish acceptable frameworks for conducting ensemble inference on multiple proteomics workflows.
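The combinatoric evaluation at the heart of this study can be sketched as a simple grid search over interchangeable steps. The option names and the user-supplied scoring function in the Python sketch below are placeholders, not the 34,576 configurations actually benchmarked:

```python
# Illustrative sketch of combinatoric workflow evaluation: every combination of
# normalization, imputation, and differential-expression test is enumerated and
# scored on a benchmark. Option names here are assumptions, not the paper's lists.

from itertools import product

NORMALIZATIONS = ["median", "quantile", "vsn"]
IMPUTATIONS = ["knn", "min_prob", "missforest"]
DE_TESTS = ["limma", "t_test"]

def rank_workflows(matrix, run_workflow):
    """run_workflow(matrix, norm, imp, test) -> F1 score; supplied by the user."""
    scored = [
        ((norm, imp, test), run_workflow(matrix, norm, imp, test))
        for norm, imp, test in product(NORMALIZATIONS, IMPUTATIONS, DE_TESTS)
    ]
    # Highest-scoring step combinations first
    return sorted(scored, key=lambda item: item[1], reverse=True)
```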


Subject(s)
Proteomics , Proteomics/methods , Workflow , Machine Learning , Proteome/metabolism , Humans , Algorithms , Databases, Protein
3.
BMC Bioinformatics ; 25(1): 184, 2024 May 09.
Article in English | MEDLINE | ID: mdl-38724907

ABSTRACT

BACKGROUND: Major advances in sequencing technologies and the sharing of data and metadata in science have resulted in a wealth of publicly available datasets. However, working with and especially curating public omics datasets remains challenging despite these efforts. While a growing number of initiatives aim to re-use previous results, these efforts present limitations that often lead to the need for further in-house curation and processing. RESULTS: Here, we present the Omics Dataset Curation Toolkit (OMD Curation Toolkit), a Python 3 package designed to accompany and guide the researcher during the curation of metadata and FASTQ files from public omics datasets. This workflow provides a standardized framework with multiple capabilities (collection, control check, treatment and integration) to facilitate the arduous task of curating public sequencing data projects. While centered on the European Nucleotide Archive (ENA), the majority of the provided tools are generic and can be used to curate datasets from different sources. CONCLUSIONS: The OMD Curation Toolkit thus offers valuable tools for the in-house curation previously needed to re-use public omics data. Due to its workflow structure and capabilities, it can be easily used and can benefit investigators developing novel omics meta-analyses based on sequencing data.
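To make the metadata-collection step concrete, the sketch below queries the public ENA Portal API for the runs of a study, roughly the kind of call such a toolkit wraps. It is not part of the OMD Curation Toolkit itself; the study accession is hypothetical, and the endpoint and field names should be checked against the current ENA API documentation before use.

```python
# Minimal sketch of collecting run metadata for an ENA study via the public
# Portal API. Accession is a placeholder; verify endpoint/fields before relying on this.

import csv
import io
import requests

ENA_FILEREPORT = "https://www.ebi.ac.uk/ena/portal/api/filereport"

def fetch_run_metadata(study_accession):
    """Return a list of dicts with run accessions and FASTQ FTP paths."""
    params = {
        "accession": study_accession,
        "result": "read_run",
        "fields": "run_accession,fastq_ftp,library_strategy",
        "format": "tsv",
    }
    response = requests.get(ENA_FILEREPORT, params=params, timeout=60)
    response.raise_for_status()
    reader = csv.DictReader(io.StringIO(response.text), delimiter="\t")
    return list(reader)

if __name__ == "__main__":
    for run in fetch_run_metadata("PRJEB00000"):   # hypothetical accession
        print(run["run_accession"], run["fastq_ftp"])
```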


Subject(s)
Data Curation , Software , Workflow , Data Curation/methods , Metadata , Databases, Genetic , Genomics/methods , Computational Biology/methods
4.
J Pak Med Assoc ; 74(4 (Supple-4)): S109-S116, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38712418

ABSTRACT

Breast cancer (BC) assessment has evolved from traditional morphological analysis to molecular profiling, identifying new subtypes. Ki-67, a prognostic biomarker, helps classify subtypes and guide chemotherapy decisions. This review explores how artificial intelligence (AI) can optimize Ki-67 assessment, improving precision and workflow efficiency in BC management. The study presents a critical analysis of the current state of AI-powered Ki-67 assessment. Results demonstrate high agreement between AI and standard Ki-67 assessment methods, highlighting AI's potential as an auxiliary tool for pathologists. Despite these advancements, the review acknowledges limitations such as the restricted timeframe and diverse study designs, emphasizing the need for further research to address these concerns. In conclusion, AI holds promise in enhancing the precision and workflow efficiency of Ki-67 assessment in BC diagnosis. While challenges persist, the integration of AI can revolutionize BC care, making it more accessible and precise, even in resource-limited settings.


Subject(s)
Artificial Intelligence , Breast Neoplasms , Ki-67 Antigen , Workflow , Humans , Breast Neoplasms/metabolism , Breast Neoplasms/pathology , Breast Neoplasms/diagnosis , Ki-67 Antigen/metabolism , Female , Biomarkers, Tumor/metabolism
5.
Med Sci Monit ; 30: e943526, 2024 May 12.
Article in English | MEDLINE | ID: mdl-38734884

ABSTRACT

BACKGROUND A significant number of atrial fibrillation (AF) recurrences occur after initial ablation, often due to pulmonary vein reconnections or triggers from non-pulmonary veins. MATERIAL AND METHODS Patients with paroxysmal AF who underwent radiofrequency catheter ablation for the first time were enrolled. Based on propensity score matching (1:1 matching), 118 patients were selected for an optimized workflow for the radiofrequency catheter ablation of paroxysmal AF (OWCA) group and a conventional group. Comparative analysis of the acute and 12-month clinical outcomes was conducted. Moreover, an artificial intelligence analytics platform was used to evaluate the quality of pulmonary vein isolation (PVI) circles. RESULTS PVI was successfully achieved in all patients. The incidence of first-pass isolation of bilateral PVI circles was higher (P=0.009) and that of acute pulmonary vein reconnections was lower (P=0.027) in the OWCA group than in the conventional group. The OWCA group displayed a significant reduction in the number of fractured points (P<0.001) and stacked points (P=0.003), and a greater proportion of cases in which the radiofrequency index achieved the target value (P=0.003). Additionally, the contact force consistently met the force-over-time criteria (P<0.001) for bilateral PVI circles in the OWCA group, accompanied by a shorter operation time (P=0.017). During the 12-month follow-up period, the OWCA group exhibited a higher atrial arrhythmia-free survival rate following the initial ablation procedure than did the conventional group. CONCLUSIONS The optimized workflow for radiofrequency catheter ablation of paroxysmal AF could play a crucial role in creating higher-quality PVI circles. This improvement is reflected in a significantly elevated 12-month atrial arrhythmia-free survival rate.


Subject(s)
Atrial Fibrillation , Catheter Ablation , Pulmonary Veins , Workflow , Humans , Atrial Fibrillation/surgery , Catheter Ablation/methods , Female , Male , Middle Aged , Treatment Outcome , Pulmonary Veins/surgery , Aged , Propensity Score , Recurrence
6.
Br Dent J ; 236(9): 718, 2024 May.
Article in English | MEDLINE | ID: mdl-38730170
7.
Br Dent J ; 236(9): 721, 2024 May.
Article in English | MEDLINE | ID: mdl-38730177

Subject(s)
Workflow , Humans , United Kingdom
8.
Sci Rep ; 14(1): 11018, 2024 05 14.
Article in English | MEDLINE | ID: mdl-38744902

ABSTRACT

Antibody-drug conjugate (ADC) payloads are cleavable drugs that act as the warhead, exerting an ADC's cytotoxic effects intracellularly in cancer cells. A simple and highly sensitive workflow is developed and validated for the simultaneous quantification of six ADC payloads, namely SN-38, MTX, DXd, MMAE, MMAF, and calicheamicin (CM). The workflow consists of a short and simple sample extraction using a methanol-ethanol mixture, followed by a fast liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis. The results showed that well-validated linear response ranges of 0.4-100 nM for SN-38, MTX, and DXd, 0.04-100 nM for MMAE and MMAF, and 0.4-1000 nM for CM were achieved in mouse serum. Recoveries for all six payloads at three different concentrations (low, medium, and high) were more than 85%. An ultra-low sample volume of only 5 µL of serum is required due to the high sensitivity of the method. This validated method was successfully applied to a pharmacokinetic study to quantify MMAE in mouse serum samples.
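Quantification over validated linear response ranges like those above reduces to a linear calibration fit; a toy Python sketch is shown below. The concentrations and peak areas are invented placeholders, not data from this study.

```python
# Toy illustration of quantification against a linear calibration curve.
# Standard concentrations and LC-MS/MS peak areas below are invented placeholders.

import numpy as np

calib_conc = np.array([0.4, 2.0, 10.0, 50.0, 100.0])         # nM standards
calib_area = np.array([1.1e3, 5.3e3, 2.6e4, 1.3e5, 2.6e5])    # measured peak areas

slope, intercept = np.polyfit(calib_conc, calib_area, deg=1)

def quantify(peak_area):
    """Back-calculate concentration (nM) from a measured peak area."""
    return (peak_area - intercept) / slope

print(f"Sample at {quantify(4.0e4):.1f} nM")
```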


Subject(s)
Immunoconjugates , Tandem Mass Spectrometry , Animals , Mice , Chromatography, Liquid/methods , Immunoconjugates/pharmacokinetics , Immunoconjugates/chemistry , Tandem Mass Spectrometry/methods , Workflow , Liquid Chromatography-Mass Spectrometry
9.
J Pathol Clin Res ; 10(3): e12376, 2024 May.
Article in English | MEDLINE | ID: mdl-38738521

ABSTRACT

The identification of gene fusions has become an integral part of soft tissue and bone tumour diagnosis. We investigated the added value of targeted RNA-based sequencing (targeted RNA-seq, Archer FusionPlex) to our current molecular diagnostic workflow of these tumours, which is based on fluorescence in situ hybridisation (FISH) for the detection of gene fusions using 25 probes. In a series of 131 diagnostic samples targeted RNA-seq identified a gene fusion, BCOR internal tandem duplication or ALK deletion in 47 cases (35.9%). For 74 cases, encompassing 137 FISH analyses, concordance between FISH and targeted RNA-seq was evaluated. A positive or negative FISH result was confirmed by targeted RNA-seq in 27 out of 49 (55.1%) and 81 out of 88 (92.0%) analyses, respectively. While negative concordance was high, targeted RNA-seq identified a canonical gene fusion in seven cases despite a negative FISH result. The 22 discordant FISH-positive analyses showed a lower percentage of rearrangement-positive nuclei (range 15-41%) compared to the concordant FISH-positive analyses (>41% of nuclei in 88.9% of cases). Six FISH analyses (in four cases) were finally considered false positive based on histological and targeted RNA-seq findings. For the EWSR1 FISH probe, we observed a gene-dependent disparity (p = 0.0020), with 8 out of 35 cases showing a discordance between FISH and targeted RNA-seq (22.9%). This study demonstrates an added value of targeted RNA-seq to our current diagnostic workflow of soft tissue and bone tumours in 19 out of 131 cases (14.5%), which we categorised as altered diagnosis (3 cases), added precision (6 cases), or augmented spectrum (10 cases). In the latter subgroup, four novel fusion transcripts were found for which the clinical relevance remains unclear: NAB2::NCOA2, YAP1::NUTM2B, HSPA8::BRAF, and PDE2A::PLAG1. Overall, targeted RNA-seq has proven extremely valuable in the diagnostic workflow of soft tissue and bone tumours.


Subject(s)
Bone Neoplasms , In Situ Hybridization, Fluorescence , Soft Tissue Neoplasms , Workflow , Humans , Bone Neoplasms/genetics , Bone Neoplasms/diagnosis , Bone Neoplasms/pathology , Soft Tissue Neoplasms/genetics , Soft Tissue Neoplasms/diagnosis , Soft Tissue Neoplasms/pathology , Female , Adult , Male , Middle Aged , Adolescent , Aged , Sequence Analysis, RNA , Child , Young Adult , Gene Fusion , Biomarkers, Tumor/genetics , Child, Preschool , Aged, 80 and over , Oncogene Proteins, Fusion/genetics
10.
Anal Chim Acta ; 1307: 342574, 2024 Jun 08.
Article in English | MEDLINE | ID: mdl-38719419

ABSTRACT

BACKGROUND: Metabolomics is nowadays considered one of the most powerful analytical approaches for the discovery of metabolic dysregulations associated with the onset of cancer, given the reprogramming of cell metabolism to meet the bioenergetic and biosynthetic demands of the malignant cell. Nevertheless, several challenges still exist regarding quality control, method standardization, data processing, and compound identification. Therefore, there is a need for effective and straightforward approaches for the untargeted analysis of structurally related classes of compounds, such as acylcarnitines, which have been widely investigated in prostate cancer research for their role in energy metabolism and in the transport and ß-oxidation of fatty acids. RESULTS: In the present study, an innovative analytical platform was developed for the straightforward yet comprehensive characterization of acylcarnitines based on high-resolution mass spectrometry, Kendrick mass defect filtering, and confirmation by prediction of their retention time in reversed-phase chromatography. In particular, a customized data processing workflow was set up in Compound Discoverer software to enable Kendrick mass defect filtering, which allowed filtering out more than 90% of the initial features resulting from the processing of 25 tumoral and adjacent non-malignant prostate tissues collected from patients undergoing radical prostatectomy. A partial least squares discriminant analysis (PLS-DA) model validated by repeated double cross-validation was then built on the dataset of 74 annotated acylcarnitines, with classification rates higher than 93% for both groups, and univariate statistical analysis helped elucidate the individual role of the annotated metabolites. SIGNIFICANCE: Hydroxylation of short- and medium-chain minor acylcarnitines appeared to be a significant variable in describing tissue differences, suggesting that neoplastic growth is linked to oxidation phenomena on selected metabolites and reinforcing the need for effective methods for the annotation of minor metabolites.
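Kendrick mass defect filtering rests on a simple rescaling of measured m/z values; a minimal Python illustration of the CH2-based variant is given below. The rounding convention and tolerance are one common choice and are not necessarily those implemented in the authors' Compound Discoverer workflow.

```python
# Sketch of CH2-based Kendrick mass defect (KMD) filtering: members of a CH2
# homologous series (e.g. acylcarnitines differing by acyl chain length) share
# approximately the same KMD, so features can be filtered by KMD similarity.

CH2_NOMINAL = 14.00000
CH2_EXACT = 14.01565

def kendrick_mass_defect(mz):
    kendrick_mass = mz * CH2_NOMINAL / CH2_EXACT
    return round(kendrick_mass) - kendrick_mass

def same_series(mz_a, mz_b, tol=0.002):
    """True if two m/z values plausibly belong to the same CH2 homologous series."""
    return abs(kendrick_mass_defect(mz_a) - kendrick_mass_defect(mz_b)) <= tol

# Example: acetylcarnitine [M+H]+ vs. butyrylcarnitine [M+H]+ (differ by C2H4)
print(same_series(204.1230, 232.1543))   # True
```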


Subject(s)
Carnitine , Prostatic Neoplasms , Male , Carnitine/analogs & derivatives , Carnitine/metabolism , Carnitine/chemistry , Carnitine/analysis , Prostatic Neoplasms/metabolism , Prostatic Neoplasms/pathology , Humans , Workflow , Metabolomics , Mass Spectrometry
11.
PLoS One ; 19(5): e0302787, 2024.
Article in English | MEDLINE | ID: mdl-38718077

ABSTRACT

Monitoring the sharing of research data through repositories is of increasing interest to institutions and funders, as well as from a meta-research perspective. Automated screening tools exist, but they are based on either narrow or vague definitions of open data. Where manual validation has been performed, it was based on a small article sample. At our biomedical research institution, we developed detailed criteria for such a screening, as well as a workflow which combines an automated and a manual step and considers both fully open and restricted-access data. We use the results for an internal incentivization scheme, as well as for monitoring in a dashboard. Here, we describe in detail our screening procedure and its validation, based on automated screening of 11,035 biomedical research articles, of which 1,381 articles with potential data sharing were subsequently screened manually. The screening results were highly reliable, as indicated by inter-rater reliability values of ≥0.8 (Krippendorff's alpha) in two different validation samples. We also report the results of the screening, both for our institution and for an independent sample from a meta-research study. In the largest of the three samples, the 2021 institutional sample, underlying data had been openly shared for 7.8% of research articles. For an additional 1.0% of articles, restricted-access data had been shared, resulting in 8.3% of articles overall having open and/or restricted-access data. The extraction workflow is then discussed with regard to its applicability in different contexts, limitations, possible variations, and future developments. In summary, we present a comprehensive, validated, semi-automated workflow for the detection of shared research data underlying biomedical article publications.
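As a very reduced picture of what the automated first pass in such a workflow can look like, the Python sketch below flags articles whose text mentions common repositories or accession patterns so they can be routed to manual screening. The term and pattern lists are illustrative only and are far simpler than the detailed criteria described in the paper.

```python
# Reduced sketch of an automated open-data screen: flag articles mentioning
# common repositories or accession-number patterns for manual review.
# Keyword and regex lists are illustrative assumptions, not the study's criteria.

import re

REPOSITORY_TERMS = ["zenodo", "figshare", "dryad", "osf.io", "gene expression omnibus"]
ACCESSION_PATTERNS = [
    re.compile(r"\bGSE\d{3,}\b"),          # GEO series
    re.compile(r"\bPRJ[EDN][A-Z]\d+\b"),   # ENA/SRA BioProjects
]

def needs_manual_screening(full_text):
    text = full_text.lower()
    if any(term in text for term in REPOSITORY_TERMS):
        return True
    return any(pattern.search(full_text) for pattern in ACCESSION_PATTERNS)

print(needs_manual_screening("Raw reads were deposited under accession PRJEB12345."))
```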


Subject(s)
Biomedical Research , Workflow , Biomedical Research/methods , Humans , Information Dissemination/methods , Access to Information , Reproducibility of Results
12.
BMC Health Serv Res ; 24(1): 560, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38693492

ABSTRACT

BACKGROUND: The rapid evolution, complexity, and specialization of oncology treatment makes it challenging for physicians to provide care based on the latest and best evidence. We hypothesized that physicians would use evidence-based trusted care pathways if they were easy to use and integrated into clinical workflow at the point of care. METHODS: Within a large integrated care delivery system, we assembled clinical experts to define and update drug treatment pathways, encoded them as flowcharts in an online library integrated with the electronic medical record, communicated expectations that clinicians would use these pathways for every eligible patient, and combined data from multiple sources to understand usage over time. RESULTS: We were able to achieve > 75% utilization of eligible protocols ordered through these pathways within two years, with > 90% of individual oncologists having consulted the pathway at least once, despite no requirements or external incentives associated with pathway usage. Feedback from users contributed to improvements and updates to the guidance. CONCLUSIONS: By making our clinical decision support easily accessible and actionable, we find that we have made considerable progress toward our goal of having physicians consult the latest evidence in their treatment decisions.


Subject(s)
Critical Pathways , Decision Support Systems, Clinical , Electronic Health Records , Medical Oncology , Workflow , Humans , Evidence-Based Medicine
13.
Methods Cell Biol ; 187: 43-56, 2024.
Article in English | MEDLINE | ID: mdl-38705629

ABSTRACT

Correlative Light Electron Microscopy (CLEM) encompasses a wide range of experimental approaches with different degrees of complexity and technical challenges where the attributes of both light and electron microscopy are combined in a single experiment. Although the biological question always determines what technology is the most appropriate, we generally set out to apply the simplest workflow possible. For 2D cell cultures expressing fluorescently tagged molecules, we report on a simple and very powerful CLEM approach by using gridded finder imaging dishes. We first determine the gross localization of the fluorescence using light microscopy and subsequently we retrace the origin/localization of the fluorescence by projecting it onto the ultrastructural reference space obtained by transmission electron microscopy (TEM). Here we describe this workflow and highlight some basic principles of the sample preparation for such a simple CLEM experiment. We will specifically focus on the steps following the resin embedding for TEM and the introduction of the sample in the electron microscope.


Subject(s)
Workflow , Humans , Microscopy, Fluorescence/methods , Microscopy, Electron, Transmission/methods , Microscopy, Electron/methods , Animals
14.
J Robot Surg ; 18(1): 204, 2024 May 08.
Article in English | MEDLINE | ID: mdl-38714574

ABSTRACT

The conventional workflow for cortical bone trajectory (CBT) screws includes tapping line-to-line or undertapping by 1 mm. We describe a non-tapping, two-step workflow for CBT screw placement, and compare its safety profile and time savings to the Tap (three-step) workflow. Patients undergoing robotic-assisted 1-3 level posterior fusion with CBT screws for degenerative conditions were identified and separated into either a No-Tap or Tap workflow. Number of total screws, screw-related complications, estimated blood loss, operative time, robotic time, and return to the operating room were collected and analyzed. There were 91 cases (458 screws) in the No-Tap and 88 cases (466 screws) in the Tap groups, with no difference in demographics, revision status, ASA grade, approach, number of levels fused, or diagnosis between cohorts. Total robotic time was lower in the No-Tap group (26.7 min) versus the Tap group (30.3 min, p = 0.053). There was no difference in the number of malpositioned screws identified intraoperatively (10 vs 6, p = 0.427), screws converted to freehand (3 vs 3, p = 0.699), or screws abandoned (3 vs 2, p = 1.000). No pedicle/pars fracture or fixation failure was seen in the No-Tap cohort, versus one in the Tap cohort (p = 1.00). No patients in either cohort returned to the OR for malpositioned screws. This study showed that the No-Tap screw insertion workflow for robot-assisted CBT placement reduces robotic time without increasing complications.


Subject(s)
Cortical Bone , Robotic Surgical Procedures , Spinal Fusion , Humans , Robotic Surgical Procedures/methods , Robotic Surgical Procedures/instrumentation , Male , Female , Middle Aged , Cortical Bone/surgery , Aged , Spinal Fusion/methods , Spinal Fusion/instrumentation , Operative Time , Bone Screws , Workflow , Pedicle Screws , Adult
15.
Article in English | MEDLINE | ID: mdl-38717248

ABSTRACT

A video can help highlight the real-time steps, anatomy, and technical aspects of a case that may be difficult to convey with text or static images alone. Editing with a regimented workflow allows only essential information to be transmitted to the viewer while maximizing efficiency throughout the editing process. This video tutorial breaks down the fundamentals of surgical video editing, with tips and pointers to simplify the workflow.


Subject(s)
Video Recording , Humans , Surgical Procedures, Operative/methods , Workflow
16.
Nat Commun ; 15(1): 3675, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38693118

ABSTRACT

The wide application of liquid chromatography-mass spectrometry (LC-MS) in untargeted metabolomics demands an easy-to-use, comprehensive computational workflow to support efficient and reproducible data analysis. However, current tools were primarily developed to perform specific tasks in LC-MS-based metabolomics data analysis. Here we introduce MetaboAnalystR 4.0 as a streamlined pipeline covering raw spectra processing, compound identification, statistical analysis, and functional interpretation. The key features of MetaboAnalystR 4.0 include an auto-optimized feature detection and quantification algorithm for LC-MS1 spectra processing, efficient MS2 spectra deconvolution and compound identification for data-dependent or data-independent acquisition, and more accurate functional interpretation through integrated spectral annotation. Comprehensive validation studies using LC-MS1 and MS2 spectra obtained from standard mixtures, dilution series, and clinical metabolomics samples have shown its excellent performance across a wide range of common tasks such as peak picking, spectral deconvolution, and compound identification, with good computing efficiency. Together with its existing statistical analysis utilities, MetaboAnalystR 4.0 represents a significant step toward a unified, end-to-end workflow for LC-MS-based global metabolomics in the open-source R environment.


Subject(s)
Mass Spectrometry , Metabolomics , Workflow , Algorithms , Chromatography, Liquid/methods , Liquid Chromatography-Mass Spectrometry , Mass Spectrometry/methods , Metabolomics/methods , Software
17.
Anal Chem ; 96(19): 7373-7379, 2024 May 14.
Article in English | MEDLINE | ID: mdl-38696819

ABSTRACT

Cross-linking mass spectrometry (XL-MS) has evolved into a pivotal technique for probing protein interactions. This study describes the implementation of Parallel Accumulation-Serial Fragmentation (PASEF) on timsTOF instruments, enhancing the detection and analysis of protein interactions by XL-MS. Addressing the challenges in XL-MS, such as the interpretation of complex spectra, low-abundance cross-linked peptides, and data acquisition bias, our current study integrates a peptide-centric approach for the analysis of XL-MS data and presents the foundation for integrating data-independent acquisition (DIA) in XL-MS with a vendor-neutral and open-source platform. A novel workflow is described for processing data-dependent acquisition (DDA) PASEF-derived data. For this, software by Bruker Daltonics is used, enabling the conversion of these data into a format that is compatible with the MeroX and Skyline software tools. Our approach significantly improves the identification of cross-linked products from complex mixtures, allowing the XL-MS community to overcome current analytical limitations.


Subject(s)
Cross-Linking Reagents , Mass Spectrometry , Software , Workflow , Cross-Linking Reagents/chemistry , Peptides/chemistry , Peptides/analysis , Humans
18.
Anal Chem ; 96(19): 7460-7469, 2024 May 14.
Article in English | MEDLINE | ID: mdl-38702053

ABSTRACT

Natural products (or specialized metabolites) are historically the main source of new drugs. However, current drug discovery pipelines require miniaturization and speeds that are incompatible with traditional natural product research methods, especially in the early stages of research. This article introduces the NP3 MS Workflow, a robust open-source software system for liquid chromatography-tandem mass spectrometry (LC-MS/MS) untargeted metabolomic data processing and analysis, designed to rank bioactive natural products directly from complex mixtures of compounds, such as bioactive biota samples. NP3 MS Workflow requires minimal user intervention while allowing customization of each step of LC-MS/MS data processing, with diagnostic statistics to support interpretation and optimization of the processing by the user. NP3 MS Workflow adds improved processing of MS2 spectra in an LC-MS/MS dataset and provides tools for automatic [M + H]+ ion deconvolution using fragmentation rules; chemical structural annotation against MS2 databases; and relative quantification of the precursor ions for bioactivity correlation scoring. The software is presented with case studies and comparisons with equivalent tools currently available. NP3 MS Workflow offers a robust and useful approach for selecting bioactive natural products from complex mixtures, improving the set of tools available for untargeted metabolomics. It can be easily integrated into natural product-based drug-discovery pipelines and into other fields of research at the interface of chemistry and biology.
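The bioactivity correlation scoring named at the end of the abstract can be pictured as correlating each precursor ion's relative quantification across fractions with the measured bioactivity of those fractions. The Python sketch below shows that general idea with invented numbers; it is not the scoring function implemented in NP3 MS Workflow.

```python
# Schematic bioactivity correlation score: correlate each ion's intensity profile
# across fractions with the fractions' bioactivity. All values are placeholders.

import numpy as np
from scipy.stats import pearsonr

# Rows: candidate ions; columns: chromatographic fractions (invented values).
ion_intensities = np.array([
    [0.1, 0.5, 3.2, 8.9, 2.1],   # tracks bioactivity -> high score expected
    [5.0, 4.8, 5.1, 4.9, 5.2],   # flat profile -> low score expected
])
bioactivity = np.array([2.0, 10.0, 55.0, 160.0, 40.0])  # e.g. % inhibition per fraction

for i, profile in enumerate(ion_intensities):
    r, p = pearsonr(profile, bioactivity)
    print(f"ion {i}: r = {r:.2f}, p = {p:.3f}")
```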


Subject(s)
Biological Products , Drug Discovery , Metabolomics , Software , Tandem Mass Spectrometry , Biological Products/chemistry , Biological Products/metabolism , Biological Products/analysis , Chromatography, Liquid/methods , Workflow
19.
Molecules ; 29(9)2024 May 01.
Article in English | MEDLINE | ID: mdl-38731577

ABSTRACT

Recently, benchtop nuclear magnetic resonance (NMR) spectrometers utilizing permanent magnets have emerged as versatile tools with applications across various fields, including food and pharmaceuticals. Their efficacy is further enhanced when coupled with chemometric methods. This study presents an innovative approach to leveraging a compact benchtop NMR spectrometer coupled with chemometrics for screening honey-based food supplements adulterated with active pharmaceutical ingredients. Initially, fifty samples seized by French customs were analyzed using a 60 MHz benchtop spectrometer. The investigation unveiled the presence of tadalafil in 37 samples, sildenafil in 5 samples, and a combination of flibanserin with tadalafil in 1 sample. After conducting comprehensive qualitative and quantitative characterization of the samples, we propose a chemometric workflow to provide an efficient screening of honey samples using the NMR dataset. This pipeline, utilizing partial least squares discriminant analysis (PLS-DA) models, enables the classification of samples as either adulterated or non-adulterated, as well as the identification of the presence of tadalafil or sildenafil. Additionally, PLS regression models are employed to predict the quantitative content of these adulterants. Through blind analysis, this workflow allows for the detection and quantification of adulterants in these honey supplements.
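To make the two chemometric models concrete, the sketch below builds a PLS-DA classifier (adulterated vs. non-adulterated) and a PLS regression for adulterant content with scikit-learn rather than the authors' own software; the spectra and labels are random placeholders standing in for binned 60 MHz NMR data.

```python
# Minimal sketch of the two model types in such a screening pipeline.
# X, y_class, and y_conc are random stand-ins for binned NMR spectra and labels.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 200))            # 50 spectra x 200 NMR bins (placeholder)
y_class = rng.integers(0, 2, size=50)     # 1 = adulterated, 0 = non-adulterated
y_conc = rng.uniform(0, 30, size=50)      # adulterant content, e.g. mg/g (placeholder)

# PLS-DA: regress the binary label and threshold the prediction at 0.5
plsda = PLSRegression(n_components=5).fit(X, y_class)
predicted_class = (plsda.predict(X).ravel() > 0.5).astype(int)

# PLS regression: predict the quantitative adulterant content
plsr = PLSRegression(n_components=5).fit(X, y_conc)
predicted_conc = plsr.predict(X).ravel()
```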


Subject(s)
Dietary Supplements , Honey , Magnetic Resonance Spectroscopy , Honey/analysis , Dietary Supplements/analysis , Magnetic Resonance Spectroscopy/methods , Sildenafil Citrate/analysis , Workflow , Chemometrics/methods , Tadalafil/analysis , Least-Squares Analysis , Drug Contamination/prevention & control , Discriminant Analysis
20.
J Appl Clin Med Phys ; 25(5): e14344, 2024 May.
Article in English | MEDLINE | ID: mdl-38615273

ABSTRACT

PURPOSE: Radiotherapy (RT) treatment and treatment planning are complex processes prepared and delivered by a multidisciplinary team of specialists. Efficient communication and notification systems among different team members are therefore essential to ensure the safe, timely delivery of treatments to patients. METHOD: To address this issue, we developed and implemented automated notification systems and an electronic whiteboard to track every CT simulation, contouring task, the new-start schedule, and physicians' appointments and tasks, and to notify team members of overdue and missing tasks and appointments. The electronic whiteboard was developed to give a straightforward view of current patients' planning workflow and to help different team members coordinate with each other. The systems were implemented and have been used at our center to monitor the progress of treatment-planning tasks for over 2 years. RESULTS: Last-minute plans were reduced by about 40% in 2023 compared with 2021 and 2022 (p < 0.05). Contouring tasks overdue by more than 1 day decreased from 46.8% in 2019 and 33.6% in 2020 to 20%-26.4% in 2021-2023 (p < 0.05) after the implementation of the notification system. The rate of plans with a 1-3 day planning time decreased by 20.31%, 39.32%, and 24.08% (p < 0.05), and the rate of plans with a 1-3 day planning time caused by contouring tasks overdue by more than 1 day decreased by 49.49%, 56.89%, and 46.52% (p < 0.05) after the implementation. The rate of outstanding appointments overdue by more than 7 days decreased by more than 5% (p < 0.05) following the implementation of the system. CONCLUSIONS: Our experience shows that this system requires minimal human intervention, improves the treatment planning workflow by reducing errors and delays, positively impacts on-time treatment plan completion, and reduces the need for compressed or rushed treatment planning timelines.
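For illustration, the core of such a notification system is a periodic overdue check over tracked tasks; a schematic Python sketch follows. The task fields and the notify() stub are assumptions made for illustration, not the system described in the paper.

```python
# Schematic overdue-task check: scan tracked planning tasks, find those past
# their due date, and notify the owner. Fields and notify() are placeholders.

from datetime import date, timedelta

tasks = [
    {"patient": "A", "task": "contouring", "owner": "physician_1",
     "due": date.today() - timedelta(days=2), "done": False},
    {"patient": "B", "task": "CT simulation", "owner": "therapist_1",
     "due": date.today() + timedelta(days=1), "done": False},
]

def notify(owner, message):
    print(f"[notify {owner}] {message}")   # stand-in for e-mail/whiteboard update

def check_overdue(task_list, today=None):
    today = today or date.today()
    for t in task_list:
        if not t["done"] and t["due"] < today:
            days_late = (today - t["due"]).days
            notify(t["owner"], f"{t['task']} for patient {t['patient']} is {days_late} day(s) overdue")

check_overdue(tasks)
```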


Subject(s)
Neoplasms , Radiotherapy Dosage , Radiotherapy Planning, Computer-Assisted , Humans , Radiotherapy Planning, Computer-Assisted/methods , Neoplasms/radiotherapy , Radiotherapy, Intensity-Modulated/methods , Workflow , Tomography, X-Ray Computed/methods